Web Survey Bibliography
Title Humanizing Cues in Internet Surveys: Investigating Respondent Cognitive Processes
Year 2017
Access date 15.09.2017
Abstract In survey methodology, humanizing cues denote procedures that imitate the interviewer and substitute for some of the interviewer's tasks (Tourangeau et al. 2003). Presenting a photo, an audio file, a video of the interviewer asking questions, or an animated person is considered a way to mobilize respondents and attract their attention. However, current methodological research on humanizing cues concentrates only on the interviewer effect and social desirability bias; it does not address the cognitive processes that are activated while answering survey questions (Krosnick 1991; 1999).
This presentation reports the results of an experiment conducted in November and December 2016 among university students (N=900) as part of a research project funded by the Polish National Science Center. The project aims to estimate the influence of humanizing cues on the quality of data obtained in internet surveys. Although various data quality indicators were used, this presentation focuses on those that describe respondents’ tendency to shortcut cognitive processes (satisficing): (a) choosing non-substantive answers to attitude questions; (b) non-differentiation when giving multiple answers on the same response scale; (c) a tendency to agree with any assertion, regardless of its content; (d) choosing options expressing approval of the status quo; and (e) choosing the first reasonable option. The following types of internet surveys were used in the experiment: (1) CAWI/text (all stimuli presented as text); (2) CAWI/photo (stimuli presented as text plus an interviewer photo); and (3) CAWI/movie (all stimuli presented as videos of real interviewers, with the answer options additionally presented as text). In addition, (4) CAPI was used as a further frame of reference. The successive versions of the research tool reflect an increasing degree of humanization of the research procedure.
Access/Direct link Conference Homepage (abstract) / (presentation)
Year of publication 2017
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography - European Survey Research Association conference 2017, ESRA, Lisbon (26)
- Effects of sampling procedure on data quality in a web survey; 2017; Rimac, I.; Ogresta, J.
- Paradata as an aide to questionnaire design: Improving quality and reducing burden; 2017; Timm, E.; Stewart, J.; Sidney, I.
- Fieldwork monitoring and managing with time-related paradata; 2017; Vandenplas, C.
- Interviewer effects on onliner and offliner participation in the German Internet Panel; 2017; Herzing, J. M. E.; Blom, A. G.; Meuleman, B.
- Interviewer Gender and Survey Responses: The Effects of Humanizing Cues Variations; 2017; Jablonski, W.; Krzewinska, A.; Grzeszkiewicz-Radulska, K.
- Millennials and emojis in Spain and Mexico; 2017; Bosch Jover, O.; Revilla, M.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Comparing the same Questionnaire between five Online Panels: A Study of the Effect of Recruitment Strategy...; 2017; Schnell, R.; Panreck, L.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Do distractions during web survey completion affect data quality? Findings from a laboratory experiment...; 2017; Wenz, A.
- Predicting Breakoffs in Web Surveys; 2017; Mittereder, F.; West, B. T.
- Measuring Subjective Health and Life Satisfaction with U.S. Hispanics; 2017; Lee, S.; Davis, R.
- Humanizing Cues in Internet Surveys: Investigating Respondent Cognitive Processes; 2017; Jablonski, W.; Grzeszkiewicz-Radulska, K.; Krzewinska, A.
- A Comparison of Emerging Pretesting Methods for Evaluating “Modern” Surveys; 2017; Geisen, E.; Murphy, J.
- The Effect of Respondent Commitment on Response Quality in Two Online Surveys; 2017; Cibelli Hibben, K.
- Pushing to web in the ISSP; 2017; Jonsdottir, G. A.; Dofradottir, A. G.; Einarsson, H. B.
- The 2016 Canadian Census: An Innovative Wave Collection Methodology to Maximize Self-Response and Internet...; 2017; Mathieu, P.
- Push2web or less is more? Experimental evidence from a mixed-mode population survey at the community...; 2017; Neumann, R.; Haeder, M.; Brust, O.; Dittrich, E.; von Hermanni, H.
- In search of best practices; 2017; Kappelhof, J. W. S.; Steijn, S.
- Redirected Inbound Call Sampling (RICS): A New Methodology; 2017; Krotki, K.; Bobashev, G.; Levine, B.; Richards, S.
- An Empirical Process for Using Non-probability Survey for Inference; 2017; Tortora, R.; Iachan, R.
- The perils of non-probability sampling; 2017; Bethlehem, J.
- A Comparison of Two Nonprobability Samples with Probability Samples; 2017; Zack, E. S.; Kennedy, J. M.
- A test of sample matching using a pseudo-web sample; 2017; Chatrchi, G.; Gambino, J.
- A Partially Successful Attempt to Integrate a Web-Recruited Cohort into an Address-Based Sample; 2017; Kott, P. S.; Farrelly, M.; Kamyab, K.
- Nonprobability sampling as model construction; 2017; Mercer, A. W.